Learning from Examples with Information Theoretic Criteria

Authors

  • José Carlos Príncipe
  • Dongxin Xu
  • Qun Zhao
  • John W. Fisher
Abstract

This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi’s quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between the computation and an information potential measuring the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (Cauchy-Schwarz inequality and Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and for classification the performance of our classifier is comparable to that of support vector machines (SVMs).
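
To make the abstract's central quantity concrete: below is a minimal NumPy sketch of the information potential, the pairwise-interaction sum whose negative log gives the Parzen-window estimate of Renyi's quadratic entropy, together with the Cauchy-Schwarz divergence built from it. The function names, the Gaussian window, and the kernel width sigma are illustrative assumptions, not code from the paper.

```python
import numpy as np

def gaussian_kernel(diff, sigma):
    # diff: (..., d) array of pairwise difference vectors; returns G_sigma(diff).
    d = diff.shape[-1]
    norm = (2.0 * np.pi * sigma ** 2) ** (-d / 2.0)
    return norm * np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2))

def information_potential(x, y=None, sigma=1.0):
    """V = (1/(N*M)) * sum_ij G(x_i - y_j) for samples x (N, d) and y (M, d).

    With y = x, -log V is the Parzen estimate of Renyi's quadratic entropy;
    convolving two Gaussian windows of width sigma gives width sigma*sqrt(2).
    """
    y = x if y is None else y
    diff = x[:, None, :] - y[None, :, :]  # (N, M, d) pairwise differences
    return float(gaussian_kernel(diff, sigma * np.sqrt(2.0)).mean())

def renyi_quadratic_entropy(x, sigma=1.0):
    # H2(X) = -log V(X); maximizing or minimizing this trains a mapper's output.
    return -np.log(information_potential(x, sigma=sigma))

def cauchy_schwarz_divergence(x, y, sigma=1.0):
    # D_CS(f, g) = -log((int fg)^2 / (int f^2 * int g^2)) >= 0 by the
    # Cauchy-Schwarz inequality, zero iff the Parzen densities coincide.
    v_xy = information_potential(x, y, sigma)
    v_x = information_potential(x, sigma=sigma)
    v_y = information_potential(y, sigma=sigma)
    return -np.log(v_xy ** 2 / (v_x * v_y))
```

For example, cauchy_schwarz_divergence(np.random.randn(200, 2), np.random.randn(200, 2) + 5.0) is large, while two overlapping sample clouds give a value near zero, which is what makes these quantities usable as training criteria.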


Related articles

Survey of Rate Distortion and Information Bottleneck from the Perspective of Unsupervised Learning

This paper introduces the ideas of unsupervised machine learning as motivation for presenting the underlying information-theoretic theory from the perspective of channel coding, and then transitions into a discussion of various extensions of information bottleneck ideas and algorithms to applications. In particular, several examples of the optimization models applied to toy problems are used to ...
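
The "optimization models" this teaser refers to are presumably built around the standard information bottleneck Lagrangian; this is the well-known formulation, not text recovered from the truncated abstract: compress X into a representation T that stays informative about a relevance variable Y.

```latex
% Information bottleneck objective (Tishby et al.): a rate-distortion
% problem with the distortion term replaced by relevant information.
\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
```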


Information-Theoretic Objective Functions for Lifelong Learning

Conventional paradigms of machine learning assume all the training data are available when learning starts. However, in lifelong learning, the examples are observed sequentially as learning unfolds, and the learner should continually explore the world and reorganize and refine the internal model or knowledge of the world. This leads to a fundamental challenge: How to balance long-term and short...


Information Theoretic Learning

Abstract of dissertation presented to the Graduate School of the University of Florida in partial fulfillment of the requirements for the degree of Doctor of Philosophy: "Information Theoretic Learning: Renyi's Entropy and Its Applications to Adaptive System Training," by Deniz Erdogmus, May 2002. Chairman: Dr. Jose C. Principe. Major department: Electrical and Computer Engineering. Traditionally, second-order ...


Information-Theoretic Learning Using Renyi’s Quadratic Entropy

Learning from examples has traditionally been based on correlation or on the mean square error (MSE) criterion, despite the fact that learning is intrinsically related to the extraction of information from examples. The problem is that Shannon's information entropy, which has a sound theoretical foundation, is not easy to implement in a learning-from-examples scenario. In this paper, Reny...
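
For reference, the quantities this teaser alludes to are conventionally written as follows; these are the standard information-theoretic-learning definitions, not text recovered from the truncated abstract:

```latex
% Renyi's quadratic entropy of X with density f_X:
H_2(X) = -\log \int f_X(x)^2 \, dx
% Parzen estimate with Gaussian windows G_\sigma over samples x_1, \dots, x_N:
\hat{H}_2(X) = -\log \frac{1}{N^2} \sum_{i=1}^{N} \sum_{j=1}^{N}
    G_{\sigma\sqrt{2}}(x_i - x_j)
% The double sum is the "information potential" V(X) of the main abstract.
```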


NEW CRITERIA FOR RULE SELECTION IN FUZZY LEARNING CLASSIFIER SYSTEMS

Designing an effective criterion for selecting the best rule is a major problem in the process of implementing Fuzzy Learning Classifier (FLC) systems. Conventionally, confidence and support, or combined measures of these, are used as criteria for fuzzy rule evaluation. In this paper, new entities, namely precision and recall from the field of Information Retrieval (IR) systems, are adapted as alternative...
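
The precision and recall borrowed from IR are presumably the standard definitions below; how the true/false positive counts are mapped onto the examples a fuzzy rule matches is this paper's contribution and is not reproduced here:

```latex
\mathrm{precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{recall} = \frac{TP}{TP + FN}
```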



Journal:
  • VLSI Signal Processing

Volume 26, Issue: -

Pages: -

Publication date: 2000